Inference in Dynamic Probabilistic Relational Models
Authors
Abstract
Stochastic processes that involve the creation and modification of objects and relations over time are widespread, but relatively poorly studied. For example, accurate fault diagnosis in factory assembly processes requires inferring the probabilities of erroneous assembly operations, but doing this efficiently and accurately is difficult. Modeled as dynamic Bayesian networks, these processes have discrete variables with very large domains and extremely high dimensionality. Recently, Sanghai et al. (2003) introduced dynamic probabilistic relational models (DPRMs) to model probabilistic relational domains that change with time. They also proposed a form of Rao-Blackwellized particle filtering which performs well on problems of this type, but relies on very restrictive assumptions. In this paper we lift these assumptions, developing two forms of particle filtering that are in principle applicable to any relational stochastic process. The first one uses an abstraction lattice over relational variables to smooth the particle filter’s estimates. The second employs kernel density estimation using a kernel function specifically designed for relational domains. Experiments show these two methods greatly outperform standard particle filtering on the task of assembly plan execution monitoring.
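To make the baseline concrete, below is a minimal bootstrap particle filter over a discrete state space, the standard method that the paper's two relational variants improve upon. This is an illustrative sketch, not the authors' algorithm: the toy domain (a part tracked between two slots `A` and `B`, with hypothetical `trans` and `obs_model` functions), the particle count, and all probabilities are assumptions chosen for the example.

```python
import random
from collections import Counter

def particle_filter(observations, n_particles, states, trans, obs_model, seed=0):
    """Bootstrap particle filter over a discrete state space.

    trans(state, rng) samples a successor state from the transition model;
    obs_model(state, obs) returns the likelihood of obs given state.
    Returns the estimated posterior over states after the last observation.
    """
    rng = random.Random(seed)
    # Initialize particles from a uniform prior over the states.
    particles = [rng.choice(states) for _ in range(n_particles)]
    for o in observations:
        # Propagate each particle through the transition model.
        particles = [trans(p, rng) for p in particles]
        # Weight each particle by the observation likelihood.
        weights = [obs_model(p, o) for p in particles]
        if sum(weights) == 0:
            # All particles inconsistent with the observation: fall back to uniform.
            weights = [1.0] * n_particles
        # Resample particles in proportion to their weights.
        particles = rng.choices(particles, weights=weights, k=n_particles)
    counts = Counter(particles)
    return {s: counts[s] / n_particles for s in states}

# Toy domain: a part stays in its slot with probability 0.9,
# and the sensor reports the true slot with probability 0.8.
states = ["A", "B"]

def trans(s, rng):
    return s if rng.random() < 0.9 else ("B" if s == "A" else "A")

def obs_model(s, o):
    return 0.8 if s == o else 0.2

posterior = particle_filter(["A", "A", "A"], 2000, states, trans, obs_model)
```

The degeneracy this sketch exhibits in large relational domains, where most particles receive near-zero weight, is exactly what the paper's abstraction-lattice smoothing and relational kernel density estimation are designed to mitigate.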
Similar Resources
Dynamic Probabilistic Relational Models
Intelligent agents must function in an uncertain world, containing multiple objects and relations that change over time. Unfortunately, no representation is currently available that can handle all these issues, while allowing for principled and efficient inference. This paper addresses this need by introducing dynamic probabilistic relational models (DPRMs). DPRMs are an extension of dynamic Ba...
Probabilistic Backward and Forward Reasoning in Stochastic Relational Worlds
Inference in graphical models has emerged as a promising technique for planning. A recent approach to decision-theoretic planning in relational domains uses forward inference in dynamic Bayesian networks compiled from learned probabilistic relational rules. Inspired by work in non-relational domains with small state spaces, we derive a backpropagation method for such nets in relational domains ...
Exploiting sparsity and sharing in probabilistic sensor data models
Probabilistic sensor models defined as dynamic Bayesian networks can possess an inherent sparsity that is not reflected in the structure of the network. Classical inference algorithms like variable elimination and junction tree propagation cannot exploit this sparsity. Also, they do not exploit the opportunities for sharing calculations among different time slices of the model. We show that, us...
Elimination Ordering in Lifted First-Order Probabilistic Inference
Various representations and inference methods have been proposed for lifted probabilistic inference in relational models. Many of these methods choose an order to eliminate (or branch on) the parameterized random variables. Similar to such methods for non-relational probabilistic inference, the order of elimination has a significant role in the performance of the algorithms. Since finding the b...
Lifted Inference and Learning in Statistical Relational Models
Statistical relational models combine aspects of first-order logic and probabilistic graphical models, enabling them to model complex logical and probabilistic interactions between large numbers of objects. This level of expressivity comes at the cost of increased complexity of inference, motivating a new line of research in lifted probabilistic inference. By exploiting symmetries of the relation...